1.
Front Psychol ; 7: 355, 2016.
Article in English | MEDLINE | ID: mdl-27014161

ABSTRACT

Studies have contended that neurotypical Japanese individuals exhibit consistent color-shape associations (red-circle, yellow-triangle, and blue-square) and that these associations could be constructed from semantic information shared between colors and shapes through learning and/or language experience. Here, we conducted two experiments, a direct questionnaire survey and an indirect behavioral test (the Implicit Association Test, IAT), to examine whether the construction of color-shape associations entails phonological information, by comparing color-shape associations in deaf and hearing participants. The direct questionnaire showed that deaf and hearing participants had similar patterns of color-shape associations (red-circle, yellow-triangle, and blue-square). However, unlike hearing participants, deaf participants failed to show facilitated processing of congruent pairs in the IAT tasks. These results suggest that color-shape associations in deaf participants may be too weak to be revealed by indirect behavioral tasks and are relatively weaker than those of hearing participants. Thus, phonological information likely plays a role in the construction of color-shape associations.

2.
Front Psychol ; 6: 383, 2015.
Article in English | MEDLINE | ID: mdl-25883582

ABSTRACT

In interpreting verbal messages, humans use not only verbal information but also non-verbal signals such as facial expressions. For example, when a person says "yes" with a troubled face, what he or she really means appears ambiguous. In the present study, we examined how deaf and hearing people differ in perceiving the real meanings of texts accompanied by representations of facial expression. Deaf and hearing participants were asked to imagine that the face presented on a computer monitor had been asked a question by another person (e.g., "Do you like her?"). They observed either a realistic or a schematic face with varying magnitudes of positive or negative expression. A balloon containing either a positive or a negative text response to the question appeared at the same time as the face. Participants then rated, on a 7-point scale, how much the individual on the monitor really meant the response (i.e., perceived earnestness). Results showed that facial expression significantly modulated perceived earnestness. The influence of a positive expression on negative text responses was weaker than that of a negative expression on positive responses (i.e., "no" tended to mean "no" irrespective of facial expression) for both participant groups. However, this asymmetrical effect was stronger in the hearing group. These results suggest that the contribution of facial expression to perceiving real meanings from text messages is qualitatively similar but quantitatively different between deaf and hearing people.

3.
Neurosci Res ; 90: 83-9, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25498951

ABSTRACT

The phonological abilities of congenitally deaf individuals are inferior to those of hearing people. However, deaf individuals can acquire spoken languages by utilizing orthography and lip-reading. The present study used functional magnetic resonance imaging (fMRI) to show that deaf individuals utilize phonological representations via a mnemonic process. We compared the brain activation of deaf and hearing participants while they memorized serially and visually presented Japanese kana letters (Kana), finger alphabets (Finger), and Arabic letters (Arabic). Hearing participants did not know which finger alphabets corresponded to which language sounds, whereas deaf participants did. All participants understood the correspondence between Kana and their language sounds, and none knew the correspondence between Arabic letters and language sounds, so the Arabic condition served as a baseline. We found that the left superior temporal gyrus (STG) was activated by phonological representations in the deaf group when memorizing both Kana and Finger. Additionally, the brain areas associated with phonological representations for Finger in the deaf group were the same as those for Kana in the hearing group. Overall, despite their superiority in visual information processing, deaf individuals utilize phonological rather than visual representations in memory for visually presented verbal material.


Subject(s)
Brain/physiopathology , Deafness/physiopathology , Magnetic Resonance Imaging , Mental Processes/physiology , Reading , Visual Perception , Adolescent , Adult , Brain Mapping , Female , Humans , Image Processing, Computer-Assisted/methods , Magnetic Resonance Imaging/methods , Male , Memory , Persons With Hearing Impairments , Young Adult
4.
PLoS One ; 6(2): e16919, 2011 Feb 16.
Article in English | MEDLINE | ID: mdl-21359223

ABSTRACT

Knowing where people look when viewing faces provides an objective measure of the information entering the visual system, as well as insight into the cognitive strategies involved in face perception. In the present study, we recorded the eye movements of 20 congenitally deaf (10 male, 10 female) and 23 normal-hearing (11 male, 12 female) Japanese participants while they evaluated the emotional valence of static face stimuli. While no difference was found in the evaluation scores, eye movements during face observation differed between the participant groups. The deaf group looked at the eyes more frequently and for longer durations than at the nose, whereas the hearing group focused on the nose (or the central region of the face) more than the eyes. These results suggest that the strategy employed to extract visual information when viewing static faces may differ between deaf and hearing people.


Subject(s)
Deafness/physiopathology , Eye Movements/physiology , Face/physiology , Fixation, Ocular/physiology , Persons With Hearing Impairments , Adult , Asian People , Attention/physiology , Deafness/congenital , Emotions/physiology , Facial Expression , Female , Humans , Male , Pattern Recognition, Visual/physiology , Photic Stimulation , Visual Perception/physiology , Young Adult